Weight Initialization-based Partial Training Algorithm
Authors
Abstract
Deep learning, or deep neural networks (DNNs), currently provides the best solutions to many classification problems. However, DNNs require far more time and computation than other existing classification algorithms. In addition, when new features and classes are added to an existing model, the whole dataset must be re-learned in order to incorporate them into the trained model, which consumes a great deal of computing power and time due to the characteristics of DNNs. To resolve this problem, in this paper we propose a partial-learning algorithm. The proposed algorithm allows new features and classes to be learned without re-learning the whole dataset. To this end, it creates latent matrices for the new weight parameters by extending the weight matrices of the existing model. Finally, we show the feasibility of the proposed algorithm through experiments.
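The abstract does not spell out how the latent matrices are constructed, so the following sketch is only a guess at the core idea: when new classes arrive, the trained output weight matrix is extended with freshly initialized rows, and a partial update touches only those new rows while the previously learned weights stay frozen. The NumPy formulation, all dimensions, and the single gradient step are illustrative assumptions, not the authors' procedure.

import numpy as np

# Hedged sketch: extend a trained output weight matrix with "latent" rows for
# new classes and update only those rows (partial learning).
rng = np.random.default_rng(0)
hidden_dim, old_classes, new_classes = 128, 10, 2

# Weights of the already-trained output layer (one row per class).
W_old = rng.normal(0.0, 0.05, size=(old_classes, hidden_dim))
b_old = np.zeros(old_classes)

# Newly initialized rows for the added classes; only these will be trained.
W_new = rng.normal(0.0, 0.05, size=(new_classes, hidden_dim))
b_new = np.zeros(new_classes)

# Extended weight matrix covering old + new classes.
W = np.vstack([W_old, W_new])
b = np.concatenate([b_old, b_new])

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# One illustrative gradient step on samples of the new classes only.
h = rng.normal(size=(32, hidden_dim))                      # hidden features of new-class samples
y = rng.integers(old_classes, old_classes + new_classes, size=32)

p = softmax(h @ W.T + b)
grad_logits = p.copy()
grad_logits[np.arange(32), y] -= 1.0                       # d(cross-entropy)/d(logits)
grad_W = grad_logits.T @ h / 32

lr = 0.1
W[old_classes:] -= lr * grad_W[old_classes:]               # partial update: new rows only
b[old_classes:] -= lr * grad_logits.mean(axis=0)[old_classes:]

The same kind of extension could presumably be applied column-wise to an input-layer weight matrix when new features, rather than new classes, are added.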
Similar articles
A new two-step learning vector quantization algorithm for image compression
The learning vector quantization (LVQ) algorithm is widely used in image compression because of its intuitively clear learning process and simple implementation. However, LVQ depends strongly on the initialization of the codebook and often converges to locally optimal results. To address these issues, a new two-step LVQ (TsLVQ) algorithm is proposed in the paper. TsLVQ uses a correcting learning st...
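The snippet above is cut off before the correcting step, so the sketch below shows only the plain competitive LVQ codebook update that such methods start from; the codebook size, block size, and learning-rate schedule are arbitrary choices for illustration.

import numpy as np

# Baseline competitive LVQ update for image-block vector quantization
# (not the TsLVQ correcting step, which is truncated in the snippet).
rng = np.random.default_rng(1)

blocks = rng.random((1000, 16))            # 4x4 image blocks flattened to 16-dim vectors
codebook = blocks[rng.choice(len(blocks), 32, replace=False)].copy()  # random initialization

lr = 0.05
for epoch in range(5):
    for x in blocks:
        j = np.argmin(((codebook - x) ** 2).sum(axis=1))   # nearest codeword
        codebook[j] += lr * (x - codebook[j])               # pull the winner toward the sample
    lr *= 0.8                                               # decay the learning rate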
Weight initialization methods for multilayer feedforward
In this paper, we present the results of an experimental comparison among seven different weight initialization methods in twelve different problems. The comparison is performed by measuring the speed of convergence, the generalization capability and the probability of successful convergence. It is not usual to find an evaluation of the three properties in the papers on weight initialization. T...
Performance analysis of a MLP weight initialization algorithm
The determination of the initial weights is an important issue in multilayer perceptron design. Recently, we have proposed a new approach to weight initialization based on discriminant analysis techniques. In this paper, the performances of multilayer perceptrons (MLPs) initialized by non-parametric discriminant analysis are compared to those of randomly initialized MLPs using several synthetic...
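As a loose illustration of discriminant-analysis-based initialization (the paper uses a non-parametric variant whose details are not in this snippet), the sketch below seeds the first-layer weights of an MLP with ordinary LDA directions from scikit-learn; the dataset and the number of hidden units are arbitrary.

import numpy as np
from sklearn.datasets import load_digits
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Rough stand-in: use LDA projection directions as initial first-layer weights
# instead of random values, so hidden units start out class-separating.
X, y = load_digits(return_X_y=True)
n_hidden = 9                                   # at most n_classes - 1 LDA directions here

lda = LinearDiscriminantAnalysis(n_components=n_hidden).fit(X, y)
W1_init = lda.scalings_[:, :n_hidden].T        # (n_hidden, n_features) first-layer weights
b1_init = -W1_init @ X.mean(axis=0)            # center the projections at the data mean

# Hidden pre-activations aligned with discriminant directions from the start.
hidden_pre = X @ W1_init.T + b1_init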
Accurate Initialization of Neural Network Weights
Proper initialization of neural networks is critical for successful training of their weights. Many methods have been proposed to achieve this, including heuristic least squares approaches. In this paper, inspired by these previous attempts to train (or initialize) neural networks, we formulate a mathematically sound algorithm based on backpropagating the desired output through the lay...
Accurate Initialization of Neural Network Weights by Backpropagation of the Desired Response
Proper initialization of neural networks is critical for successful training of their weights. Many methods have been proposed to achieve this, including heuristic least squares approaches. In this paper, inspired by these previous attempts to train (or initialize) neural networks, we formulate a mathematically sound algorithm based on backpropagating the desired output through the layers of a ...
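The text is truncated, but the general idea of backpropagating the desired response and fitting layers by least squares can be sketched roughly as below; the pseudo-inverse mapping, the tanh activation, the clipping range, and all dimensions are our own simplifications rather than the authors' exact algorithm.

import numpy as np

# Hedged sketch: map the desired output backward through the output layer to a
# desired hidden response, then set each layer's weights by linear least squares.
rng = np.random.default_rng(2)

X = rng.normal(size=(500, 20))                              # inputs
T = np.eye(5)[rng.integers(0, 5, size=500)] * 1.8 - 0.9     # targets scaled into tanh range

# Small random guess for the output layer.
W2 = rng.normal(0.0, 0.1, size=(64, 5))

# "Backpropagate" the desired response with a pseudo-inverse of the output weights.
H_desired = np.clip(T @ np.linalg.pinv(W2), -0.95, 0.95)

# Least-squares fit of the first layer to produce that hidden response
# (through an arctanh "inverse" of the tanh activation).
W1, *_ = np.linalg.lstsq(X, np.arctanh(H_desired), rcond=None)

# Re-fit the output layer by least squares on the actual hidden activations.
H = np.tanh(X @ W1)
W2, *_ = np.linalg.lstsq(H, T, rcond=None)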
Journal:
Volume, issue:
Pages: -
Publication date: 2016